Riemannian Sparse Coding for Positive Definite Matrices
Authors
Abstract
Inspired by the great success of sparse coding for vector valued data, our goal is to represent symmetric positive definite (SPD) data matrices as sparse linear combinations of atoms from a dictionary, where each atom itself is an SPD matrix. Since SPD matrices follow a non-Euclidean (in fact a Riemannian) geometry, existing sparse coding techniques for Euclidean data cannot be directly extended. Prior works have approached this problem by defining a sparse coding loss function using either extrinsic similarity measures (such as the log-Euclidean distance) or kernelized variants of statistical measures (such as the Stein divergence, Jeffrey’s divergence, etc.). In contrast, we propose to use the intrinsic Riemannian distance on the manifold of SPD matrices. Our main contribution is a novel mathematical model for sparse coding of SPD matrices; we also present a computationally simple algorithm for optimizing our model. Experiments on several computer vision datasets showcase superior classification and retrieval performance compared with state-of-the-art approaches.
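The intrinsic Riemannian distance the abstract refers to is the affine-invariant metric on the SPD manifold, d(X, Y) = ||log(X^{-1/2} Y X^{-1/2})||_F. Below is a minimal sketch of how it can be computed via generalized eigenvalues; the function name `airm_distance` is illustrative, not from the paper, and the comment stating the sparse coding objective is a paraphrase of the model described above, not its exact formulation.

```python
import numpy as np
from scipy.linalg import eigvalsh

def airm_distance(X, Y):
    """Affine-invariant Riemannian distance between SPD matrices X and Y.

    d(X, Y) = ||log(X^{-1/2} Y X^{-1/2})||_F, which equals
    sqrt(sum_i log^2(lambda_i)) where lambda_i are the generalized
    eigenvalues of the pencil (Y, X).
    """
    lam = eigvalsh(Y, X)  # solves Y v = lambda X v
    return np.sqrt(np.sum(np.log(lam) ** 2))

# A sparse coding loss in the spirit of the paper would then seek
# coefficients alpha minimizing
#   airm_distance(sum_j alpha_j * B_j, X)**2 + lam * ||alpha||_1
# over a dictionary of SPD atoms B_j (details as in the paper).
```

As a sanity check, the distance is zero for identical inputs and symmetric in its arguments, e.g. `airm_distance(np.eye(2), np.e * np.eye(2))` equals sqrt(2).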
Similar Papers
Sparse Coding and Dictionary Learning for Symmetric Positive Definite Matrices: A Kernel Approach
Recent advances suggest that a wide range of computer vision problems can be addressed more appropriately by considering non-Euclidean geometry. This paper tackles the problem of sparse coding and dictionary learning in the space of symmetric positive definite matrices, which form a Riemannian manifold. With the aid of the recently introduced Stein kernel (related to a symmetric version of Breg...
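For context, the symmetric Stein divergence (also known as the Jensen-Bregman LogDet divergence) underlying the Stein kernel is S(X, Y) = log det((X + Y)/2) − (1/2) log det(XY). A minimal sketch, computed with log-determinants for numerical stability (the function name is illustrative):

```python
import numpy as np

def stein_divergence(X, Y):
    """Symmetric Stein (Jensen-Bregman LogDet) divergence between SPD X, Y.

    S(X, Y) = log det((X + Y)/2) - (1/2) * (log det X + log det Y)
    """
    _, ld_mid = np.linalg.slogdet((X + Y) / 2.0)
    _, ld_x = np.linalg.slogdet(X)
    _, ld_y = np.linalg.slogdet(Y)
    return ld_mid - 0.5 * (ld_x + ld_y)
```

A positive definite kernel can then be built from this divergence (for suitable scaling parameters) as k(X, Y) = exp(−β S(X, Y)), which is the construction this line of work kernelizes.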
Positive Definite Matrices: Data Representation and Applications to Computer Vision
Numerous applications in computer vision and machine learning rely on representations of data that are compact, discriminative, and robust while satisfying several desirable invariances. One such recently successful representation is offered by symmetric positive definite (SPD) matrices. However, the modeling power of SPD matrices comes at a price: rather than a flat Euclidean view, SPD matrice...
Generalized Dictionary Learning for Symmetric Positive Definite Matrices with Application to Nearest Neighbor Retrieval
We introduce Generalized Dictionary Learning (GDL), a simple but practical framework for learning dictionaries over the manifold of positive definite matrices. We illustrate GDL by applying it to Nearest Neighbor (NN) retrieval, a task of fundamental importance in disciplines such as machine learning and computer vision. GDL distinguishes itself from traditional dictionary learning approaches by...
Optimized Kernel-based Projection Space of Riemannian Manifolds
Recent advances in computer vision suggest that encoding images through Symmetric Positive Definite (SPD) matrices can lead to increased classification performance. Taking into account manifold geometry is typically done via embedding the manifolds in tangent spaces, or Reproducing Kernel Hilbert Spaces (RKHS). Recently it was shown that projecting such manifolds into a kernel-based random proj...
Riemannian Multi-Manifold Modeling
This paper advocates a novel framework for segmenting a dataset in a Riemannian manifold M into clusters lying around low-dimensional submanifolds of M. Important examples of M, for which the proposed clustering algorithm is computationally efficient, are the sphere, the set of positive definite matrices, and the Grassmannian. The clustering problem with these examples of M is already useful ...